Patent abstract:
The present invention relates to a mobile terminal and its control method. The mobile terminal (100) includes a display (151) and a controller (180) configured to display on the display (151) a home screen on which at least one icon is displayed to execute an application corresponding to an icon selected from the at least one icon and to display at least one of a shape, color and size of the at least one icon according to an execution environment of the application. According to the present invention, a user can intuitively recognize an execution state of an application because an icon corresponding to the application may be displayed differently depending on application execution environments.
Publication number: FR3028063A1
Application number: FR1554401
Filing date: 2015-05-18
Publication date: 2016-05-06
Inventors: Minhun Park; Jongbeom Kim; Hyukjae Jang
Applicant: LG Electronics Inc.
Primary IPC class:
Patent description:

[0001] The present invention claims the benefit of the prior filing date of, and priority to, Korean application No. 10-2014-0149588, filed October 30, 2014. The present invention relates to a mobile terminal and, in particular, to a mobile terminal configured to allow a user to intuitively recognize an execution state of an application by displaying an icon corresponding to the application differently according to the execution environment of the application, and to a method for controlling the same. As the functions of terminals such as personal computers, laptops and cell phones have diversified, terminals have become multimedia players with multiple functions for capturing still or moving images, reproducing music files and animated images, playing games and receiving broadcast programs. Terminals can be classified as mobile terminals and fixed terminals. Mobile terminals may further be divided into handheld terminals and vehicle-mounted terminals depending on whether users can personally carry them. Conventional terminals, including mobile terminals, offer an increasing number of complex and diverse functions. To support and improve this increasing number of functions, improvement of a structural part and/or a software part of the terminal would be desirable. The present invention provides a mobile terminal configured to allow a user to intuitively recognize an execution state of an application by displaying an icon corresponding to the application differently depending on the application's execution environment, and a method for controlling the same. The accompanying drawings, which are included to provide a better understanding of the invention and are incorporated in and form part of this specification, illustrate one or more embodiments of the invention and, together with the description, serve to explain the principle of the invention. In the drawings: FIG. 
1A is a block diagram of a mobile terminal according to one embodiment; Figure 1B is a front perspective view of the mobile terminal according to one embodiment; Figure 1C is a rear perspective view of the mobile terminal according to one embodiment; Figure 2 is a conceptual view of a deformable mobile terminal according to an alternative embodiment of the present invention; Fig. 3 is a flowchart illustrating an operation of the mobile terminal according to one embodiment of the present invention; Figures 4 and 5 illustrate an operation of the camera of the mobile terminal shown in Figure 3; Figures 6, 7 and 8 illustrate display states of an icon corresponding to the camera of the mobile terminal shown in Figure 3; Figures 9 to 19 illustrate manipulation of the camera of the mobile terminal shown in Figure 3 and the icon corresponding thereto; Figures 20 to 23 illustrate an operation corresponding to another application of the mobile terminal shown in Figure 3; and Fig. 24 illustrates an operation corresponding to a further application of the mobile terminal shown in Fig. 3. Arrangements and embodiments are now described more fully with reference to the accompanying drawings, in which illustrative embodiments are shown. Embodiments may, however, be implemented in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept to those skilled in the art. A mobile terminal is described below with reference to the accompanying drawings. In the following description, the suffixes "module" and "unit" are given to components of the mobile terminal only for ease of description, and they do not have meanings or functions distinguishable from each other.
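The invention's central behavior, summarized in the abstract, is that the controller displays at least one of the shape, color and size of an icon according to the execution environment of the corresponding application. A minimal sketch of that idea follows; it is illustrative only and not LG's implementation, and the execution-state names and style table are assumptions introduced here, since the patent does not enumerate them.

```python
from dataclasses import dataclass

# Hypothetical execution states and the icon styling applied to each.
# The patent only requires that at least one of shape, color and size
# varies with the application's execution environment.
STATE_STYLES = {
    "idle":      {"shape": "square", "color": "gray",  "size": 48},
    "running":   {"shape": "square", "color": "green", "size": 48},
    "recording": {"shape": "circle", "color": "red",   "size": 56},
}

@dataclass
class Icon:
    app_name: str
    shape: str = "square"
    color: str = "gray"
    size: int = 48

def update_icon(icon: Icon, execution_state: str) -> Icon:
    """Return a restyled copy of the icon reflecting the application's
    current execution environment; unknown states fall back to idle."""
    style = STATE_STYLES.get(execution_state, STATE_STYLES["idle"])
    return Icon(icon.app_name, style["shape"], style["color"], style["size"])
```

With such a mapping, a user glancing at the home screen can tell, for example, that the camera application is recording, because its icon changed shape and color.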
[0002] The mobile terminal may include a cellular phone, a smart phone, a laptop, a digital broadcast terminal, a personal digital assistant (PDA), a portable media player (PMP), a navigation system, and so on.
[0003] Figure 1A is a block diagram of a mobile terminal according to one embodiment. Other embodiments, configurations and arrangements may also be provided. As shown, the mobile terminal 100 may comprise a wireless communication unit 110 (or radio communication unit), an audio/video (A/V) input unit 120, a user input unit 130, a detection unit 140, an output unit 150, a memory 160, an interface 170, a controller 180 and a power supply 190. The components shown in FIG. 1A are not all essential, and the number of components included in the mobile terminal 100 may vary. The components of the mobile terminal 100 are now described. The wireless communication unit 110 may comprise at least one module that allows radio communication between the mobile terminal 100 and a radio communication system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may comprise a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 (or local communication module), and a location information module 115 (or position information module). The broadcast receiving module 111 may receive broadcast signals and/or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel, and the broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously created broadcast signals and/or broadcast-related information and transmits them to a terminal. 
The broadcast signals may comprise not only TV broadcast signals, radio broadcast signals and data broadcast signals, but also signals in the form of a combination of a TV broadcast signal and a radio broadcast signal. The broadcast-related information may be information on a broadcast channel, a broadcast program or a broadcast service provider, and may be provided even through a mobile communication network. In the latter case, the broadcast-related information may be received by the mobile communication module 112. The broadcast-related information may exist in various forms. For example, the broadcast-related information may exist in the form of an electronic program guide (EPG) of a digital multimedia broadcasting (DMB) system, or in the form of an electronic service guide (ESG) of a digital video broadcasting-handheld (DVB-H) system. The broadcast receiving module 111 may receive broadcast signals using various broadcasting systems. More particularly, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), media forward link only (MediaFLO), DVB-H and integrated services digital broadcast-terrestrial (ISDB-T) systems. The broadcast receiving module 111 may also receive signals from broadcasting systems providing broadcast signals other than the digital broadcasting systems described above. The broadcast signals and/or broadcast-related information received through the broadcast receiving module 111 may be stored in the memory 160. The mobile communication module 112 may transmit/receive a radio signal to/from at least one of a base station, an external terminal and a server on a mobile communication network. The radio signal may include a voice call signal, a video call signal, or data in various forms according to the transmission and reception of text/multimedia messages. 
The wireless Internet module 113 may correspond to a module for wireless Internet access and may be included in the mobile terminal 100 or externally connected to the mobile terminal 100. Wireless LAN (WLAN or Wi-Fi), wireless broadband (WiBro), worldwide interoperability for microwave access (WiMAX), high-speed downlink packet access (HSDPA) and so on may be used as a wireless Internet technique. The short-range communication module 114 may correspond to a module for short-range communication. In addition, Bluetooth®, radio frequency identification (RFID), infrared data association (IrDA), ultra-wideband (UWB) and/or ZigBee® may be used as a short-range communication technique. The location information module 115 may confirm or obtain a location or position of the mobile terminal 100. The location information module 115 may obtain position information using a global navigation satellite system (GNSS). GNSS is a terminology describing a radio navigation satellite system that revolves around the earth and transmits reference signals to predetermined types of radio navigation receivers so that the radio navigation receivers can determine their positions on or near the earth's surface. GNSS includes, for example, the global positioning system (GPS) of the United States, Galileo of Europe, the global orbiting navigational satellite system (GLONASS) of Russia, COMPASS of China, and the quasi-zenith satellite system (QZSS) of Japan. A global positioning system (GPS) module is a representative example of the location information module 115. The GPS module may calculate information on the distances between one point or object and at least three satellites, and information on the time at which the distance information was measured, and apply trilateration to the obtained distance information to obtain three-dimensional position information on the point or object in terms of latitude, longitude and altitude at a predetermined time. 
A method of calculating position and time information using three satellites and correcting the calculated position and time information using another satellite may also be used. In addition, the GPS module may continuously calculate the current position in real time and calculate velocity information using the position information.
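The three-satellite position calculation described above can be illustrated with a plain trilateration sketch. This is an illustrative sketch only: it uses Cartesian coordinates rather than latitude/longitude/altitude, ignores the receiver clock error that a real GPS fix resolves with the fourth satellite mentioned in the text, and all helper names are introduced here.

```python
import math

# Small vector helpers over 3-tuples.
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, k): return tuple(x * k for x in a)
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Return the two candidate points at distances r1, r2, r3 from
    satellite positions p1, p2, p3; a fourth satellite (or a known
    altitude) resolves which of the two is the receiver's position."""
    # Build an orthonormal frame with p1 at the origin and p2 on the x-axis.
    ex = scale(sub(p2, p1), 1.0 / norm(sub(p2, p1)))
    i = dot(ex, sub(p3, p1))
    ey_raw = sub(sub(p3, p1), scale(ex, i))
    ey = scale(ey_raw, 1.0 / norm(ey_raw))
    ez = cross(ex, ey)
    d = norm(sub(p2, p1))
    j = dot(ey, sub(p3, p1))
    # Intersection of the three spheres in that frame.
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    base = add(add(p1, scale(ex, x)), scale(ey, y))
    return add(base, scale(ez, z)), add(base, scale(ez, -z))
```

Given exact distances, the receiver's true position appears as one of the two returned candidates, mirroring the ambiguity the text resolves with an additional satellite.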
[0004] The A/V input unit 120 may be used to input (or receive) an audio signal and/or a video signal. The A/V input unit 120 may comprise a camera 121 and a microphone 122. The camera 121 may process still image frames or moving image frames obtained by an image sensor in a video telephony mode or a photography mode. The processed image frames may be displayed on the display 151, which may be a touch screen. The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to an external device via the wireless communication unit 110. The mobile terminal 100 may also include at least two cameras 121. The microphone 122 may receive an external audio signal in a calling mode, a recording mode and/or a voice recognition mode, and may process the received audio signal into electrical audio data. The audio data may then be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output in the calling mode. The microphone 122 may use various noise reduction algorithms (or noise suppression algorithms) to eliminate or attenuate noise generated when the external audio signal is received. The user input unit 130 may receive input data from a user for controlling the operation of the mobile terminal 100. The user input unit 130 may include a keypad, a dome switch, a touchpad (static pressure/capacitance), a jog wheel, a jog switch and/or the like. The detection unit 140 may detect a current state of the mobile terminal 100, such as an open/closed state of the mobile terminal 100, a position of the mobile terminal 100, whether a user is touching the mobile terminal 100, an orientation of the mobile terminal 100 and acceleration/deceleration of the mobile terminal 100, and may generate a detection signal for controlling the operation of the mobile terminal 100. 
For example, in the case of a flip-type phone, the detection unit 140 can detect whether the phone is open or closed. In addition, the detection unit 140 can detect whether the power supply 190 is supplying power and/or whether the interface 170 is connected to an external device. The detection unit 140 may also include a proximity sensor 141, and may detect a movement of the mobile terminal 100. The output unit 150 may generate visual, audible and/or tactile output, and may include the display 151, an audio output module 152, an alarm 153 and a haptic module 154. The display 151 may display information processed by the mobile terminal 100. The display 151 may display a user interface (UI) and/or a graphical user interface (GUI) relating to a telephone call when the mobile terminal 100 is in the calling mode. The display 151 may also display a captured and/or received image, UI or GUI when the mobile terminal 100 is in the video telephony mode or the photography mode. The display 151 may comprise at least one of a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode display, a flexible display and/or a three-dimensional display. The display 151 may be of a transparent type or a light-transmitting type; specifically, the display 151 may include a transparent display such as a transparent LCD. A rear structure of the display 151 may also be of a light-transmitting type. Accordingly, a user can see an object located behind the body of the mobile terminal 100 through the transparent area of the body that is occupied by the display 151. The mobile terminal 100 may also include at least two displays. For example, the mobile terminal 100 may include a plurality of displays 151 that are arranged on a single face at a predetermined distance from each other, or integrated displays. 
The plurality of displays 151 may also be arranged on different sides. When the display 151 and a sensor that detects touch (hereinafter referred to as a "touch sensor") form a layered structure, referred to as a touch screen, the display 151 may be used as an input device in addition to an output device. The touch sensor may be in the form of a touch film, a touch sheet and/or a touchpad, for example.
[0005] The touch sensor may convert a variation in pressure applied to a specific portion of the display 151, or a variation in capacitance generated at a specific portion of the display 151, into an electrical input signal. The touch sensor may detect a touch pressure as well as the position and area of the touch. When the user applies a touch input to the touch sensor, a signal corresponding to the touch input may be transmitted to a touch controller. The touch controller may then process the signal and transmit data corresponding to the processed signal to the controller 180. Accordingly, the controller 180 can detect the touched portion of the display 151. The proximity sensor 141 (of the detection unit 140) may be positioned in an internal region of the mobile terminal 100 surrounded by the touch screen, and/or near the touch screen. The proximity sensor 141 may detect an object approaching a predetermined detection face, or an object located near the proximity sensor 141, using an electromagnetic force or infrared rays without mechanical contact. The proximity sensor 141 may have a longer lifespan than a contact sensor and may thus find wide application in the mobile terminal 100. The proximity sensor 141 may comprise a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor and/or an infrared proximity sensor. A capacitive touch screen may be constructed such that the proximity of a pointer is detected through a variation of an electric field caused by the proximity of the pointer. In that case, the touch screen (touch sensor) may be classified as a proximity sensor 141. 
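The signal path just described (variation at a portion of the display, digitized by the touch controller, then forwarded to the controller 180) can be sketched as a simple scan over per-cell capacitance readings. The baseline value, threshold and units are assumptions for illustration; real values depend on the panel hardware.

```python
# Illustrative calibration constants (arbitrary units), not panel specs.
CAP_BASELINE = 100.0    # idle reading per sensing cell
TOUCH_THRESHOLD = 15.0  # minimum variation treated as a touch

def scan_panel(readings):
    """Mimic the touch controller's role: convert per-cell capacitance
    readings into events carrying the touched position and a pressure-like
    magnitude, which would then be handed to the controller 180."""
    events = []
    for (row, col), value in readings.items():
        delta = value - CAP_BASELINE
        if delta >= TOUCH_THRESHOLD:
            events.append({"pos": (row, col), "pressure": delta})
    return events
```

Cells whose variation stays below the threshold produce no event, which is how the controller can distinguish the touched portion of the display from the rest.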
For ease of explanation, an action of the pointer approaching the touch screen without actually touching it may be referred to as a "proximity touch", and an action of bringing the pointer into contact with the touch screen may be referred to as a "contact touch". The proximity-touch point of the pointer on the touch screen corresponds to the point on the touch screen at which the pointer is perpendicular to the touch screen.
[0006] The proximity sensor 141 can detect a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch motion state, etc.). Information corresponding to the detected proximity touch action and proximity touch pattern may then be displayed on the touch screen. The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal receiving mode, a telephone call mode, a recording mode, a voice recognition mode or a broadcast receiving mode. The audio output module 152 may output audio signals related to functions performed in the mobile terminal 100, such as an incoming call signal tone and an incoming message tone. The audio output module 152 may include a receiver, a speaker, a vibrator and/or the like. The audio output module 152 may output sounds through an earphone jack; the user can hear the sounds by connecting an earphone to the earphone jack. The alarm 153 may output a signal indicating the generation of an event of the mobile terminal 100. For example, an alarm may be generated upon receiving a call signal, receiving a message, inputting a key signal and/or inputting a touch. The alarm 153 may also output signals in forms other than video signals or audio signals, for example a signal indicating the generation of an event by means of a vibration. The video signals and/or audio signals may also be output through the display 151 or the audio output module 152. The haptic module 154 may generate various haptic effects that the user can feel. A typical example of a haptic effect is vibration. The intensity and/or pattern of a vibration generated by the haptic module 154 can also be controlled. For example, different vibrations may be combined and output, or output sequentially. 
The haptic module 154 can generate a variety of haptic effects in addition to vibration, including: a stimulus effect from an arrangement of pins moving vertically against the skin surface in contact; a stimulus effect from a jet force or suction force of air through a jet orifice or a suction port; a stimulus effect of rubbing the skin; a stimulus effect from contact with an electrode; a stimulus effect using an electrostatic force; and an effect of reproducing cold and warmth using an element capable of absorbing or radiating heat. The haptic module 154 may not only transmit haptic effects through direct contact, but may also allow the user to feel haptic effects through a kinesthetic sense of the user's fingers or arms. The mobile terminal 100 may include a plurality of haptic modules 154. The memory 160 may store a program for the operations of the controller 180 and/or temporarily store input/output data such as a phone book, still images and/or moving images. The memory 160 may also store data on vibrations and sounds of various patterns that are output when a touch input is applied to the touch screen. The memory 160 may comprise at least one of a flash memory, a hard-disk-type memory, a multimedia-card micro-type memory, a card-type memory such as SD or XD memory, a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), a programmable ROM (PROM), a magnetic memory, a magnetic disk and/or an optical disk. The mobile terminal 100 may also operate in connection with a web storage that performs the storage function of the memory 160 on the Internet. The interface 170 may serve as a path to external devices connected to the mobile terminal 100. 
The interface 170 may receive data or power from external devices and transmit the data or power to internal components of the mobile terminal 100, or transmit data from the mobile terminal 100 to the external devices. For example, the interface 170 may include a wired or wireless headset port, an external charger port, a wired or wireless data port, a memory card port, a port for connecting a device having a user identification module, an audio I/O port, a video I/O port and/or an earphone port. The interface 170 may also interface with a user identification module, which is a chip that stores information for authenticating the authority to use the mobile terminal 100. For example, the user identification module may be a user identity module (UIM), a subscriber identity module (SIM) and/or a universal subscriber identity module (USIM). An identification device including the user identification module may also be implemented in the form of a smart card; accordingly, the identification device may be connected to the mobile terminal 100 through a port of the interface 170. The interface 170 may also be a path through which electric power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the external cradle, or a path through which various control signals input by the user through the cradle are transmitted to the mobile terminal 100. The various control signals or the power input from the cradle may be used as signals for confirming that the mobile terminal 100 is correctly seated in the cradle. The controller 180 may control the overall operations of the mobile terminal 100. For example, the controller 180 may perform control and processing for voice communication, data communication and/or video telephony. The controller 180 may also include a multimedia module 181 for multimedia playback. The multimedia module 181 may be included in the controller 180 or may be separate from the controller 180. 
The controller 180 may perform a pattern recognition process capable of recognizing handwriting input or drawing input applied to the touch screen as characters or images. The power supply 190 may receive external power and internal power and provide the power required for the operation of the components of the mobile terminal 100 under the control of the controller 180. According to a hardware implementation, embodiments may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors and/or electrical units for performing functions. Embodiments may be implemented by the controller 180. According to a software implementation, embodiments such as procedures and functions may be implemented with a separate software module that performs at least one function or operation. Software code may be implemented as a software application written in an appropriate software language. The software code may be stored in the memory 160 and executed by the controller 180. Figure 1B is a front perspective view of a mobile terminal (or handheld terminal) according to one embodiment. The mobile terminal 100 may have a bar-type terminal body.
[0007] However, embodiments are not limited to a bar-type terminal and may be applied to terminals of various types, including slide-type, folder-type, swing-type and/or swivel-type terminals having at least two bodies movably combined with each other. The terminal body may comprise a housing (case, cover, etc.) that forms the exterior of the mobile terminal 100. In this embodiment, the housing may be divided into a front housing 101 and a rear housing 102. Various electronic components may be arranged in the space formed between the front housing 101 and the rear housing 102, and at least one middle housing may additionally be disposed between the front housing 101 and the rear housing 102. The housings may be formed of plastic material by injection molding, or made of a metallic material such as stainless steel (STS) or titanium (Ti). The display 151, the audio output unit 152, the camera 121, the user input units 130/131 and 132, the microphone 122 and the interface 170 may be arranged in the terminal body and, more specifically, in the front housing 101. The display 151 may occupy most of the main face of the front housing 101. The audio output unit 152 and the camera 121 may be arranged in a region close to one of the two ends of the display 151, and the user input unit 131 and the microphone 122 may be positioned in a region close to the other end of the display 151. The user input unit 132 and the interface 170 may be arranged on the sides of the front housing 101 and the rear housing 102. The user input unit 130 may receive commands for controlling the operation of the mobile terminal 100 and may include a plurality of functional units 131 and 132. The functional units 131 and 132 may be referred to as manipulation portions and may employ any tactile manner in which a user operates them while having a tactile sensation. 
The first and second functional units 131 and 132 may receive various inputs. For example, the first functional unit 131 may receive commands such as start, end and scroll, and the second functional unit 132 may receive commands such as adjusting the volume of sound output from the audio output unit 152 or switching the display 151 to a touch recognition mode. Figure 1C is a rear perspective view of the mobile terminal (shown in Figure 1B) according to one embodiment.
[0008] Referring to Figure 1C, a camera 121' may additionally be attached to the rear side of the terminal body (that is, the rear housing 102). The camera 121' may have a photographing direction opposite to that of the camera 121 (shown in Figure 1B) and may have a number of pixels different from that of the camera 121 (shown in Figure 1B).
[0009] For example, it may be desirable for the camera 121 to have a low number of pixels so that the camera 121 can capture an image of the user's face and transmit the image to a receiving party during video telephony, while the camera 121' has a high number of pixels because the camera 121' often captures an image of a general object and does not immediately transmit the image. The cameras 121 and 121' may be attached to the terminal body such that they can be rotated or popped up. A flash 123 and a mirror 124 may additionally be disposed near the camera 121'. The flash 123 may illuminate an object when the camera 121' takes a picture of the object. The mirror 124 may be used by the user to look at his or her face when the user wants to take a self-portrait using the camera 121'. An audio output unit 152' may additionally be provided on the rear side of the terminal body. The audio output unit 152' may achieve a stereo function together with the audio output unit 152 (shown in Figure 1B) and may be used in a speakerphone mode when the terminal is used for a telephone call. A broadcast signal receiving antenna may additionally be attached to a side of the terminal body, in addition to an antenna for telephone calls. The antenna forming part of the broadcast receiving module 111 (shown in Figure 1A) may be installed in the terminal body such that the antenna can be pulled out of the terminal body. The power supply 190 for providing power to the mobile terminal 100 may be installed in the terminal body; it may be included in the terminal body or detachably attached to the terminal body. A touch pad 135 for detecting touch may be attached to the rear housing 102. The touch pad 135 may be of a light-transmitting type, like the display 151. In this example, if the display 151 outputs visual information through both of its faces, the visual information can be recognized through the touch pad 135. 
The information output through both faces of the display 151 may be controlled by the touch pad 135. Alternatively, a display may additionally be attached to the touch pad 135 so that a touch screen is arranged in the rear housing 102 as well. The touch pad 135 may operate in conjunction with the display 151 of the front housing 101. The touch pad 135 may be positioned behind and parallel to the display 151, and may be the same size as, or smaller than, the display 151. FIG. is a view for explaining a proximity depth of a proximity sensor.
[0010] As shown in Figure 3, when a pointer (such as the user's finger) approaches the touch screen, the proximity sensor located in or near the touch screen can detect the approach of the pointer and output a proximity signal. The proximity sensor may be constructed such that it outputs a proximity signal according to the distance between the pointer approaching the touch screen and the touch screen (referred to as the "proximity depth"). The distance at which the proximity signal is output when the pointer approaches the touch screen may be referred to as the detection distance. The proximity depth can be determined by using a plurality of proximity sensors having different detection distances and comparing the proximity signals respectively output from those proximity sensors. FIG. 2 is a conceptual view of a deformable mobile terminal according to an alternative embodiment of the present invention. In this figure, the mobile terminal 200 is shown with a display unit 251, which is a display that is deformable by an external force. This deformation, which may involve the display unit 251 and other components of the mobile terminal 200, may include any of curving, bending, folding, twisting, rolling, and combinations thereof. The deformable display unit 251 may also be referred to as a "flexible display unit". In some implementations, the flexible display unit 251 may include a general flexible display, electronic paper (also known as e-paper), and combinations thereof. In general, the mobile terminal 200 may be configured to include features that are the same as or similar to those of the mobile terminal 100 of Figs. 1A-1C. The flexible display of the mobile terminal 200 is generally formed as a lightweight, non-fragile display which still exhibits the characteristics of a conventional flat-panel display, but is instead fabricated on a flexible substrate that can be deformed as noted previously. 
The term "electronic paper" may be used to refer to a display technology employing the characteristics of ordinary ink, and differs from a conventional flat-panel display in using reflected light. Electronic paper is generally understood to change displayed information using a twist ball or via electrophoresis using capsules. In a state in which the flexible display unit 251 is not deformed (for example, a state with an infinite radius of curvature, referred to as a first state), the display region of the flexible display unit 251 is a generally flat surface. In a state in which the flexible display unit 251 is deformed from the first state by an external force (for example, a state with a finite radius of curvature, referred to as a second state), the display region may become a curved or bent surface. As illustrated, information displayed in the second state may be visual information output on the curved surface. The visual information may be realized in such a manner that light emission of each unit pixel (sub-pixel) arranged in a matrix configuration is controlled independently. The unit pixel denotes an elementary unit for representing one color. According to an alternative embodiment, the first state of the flexible display unit 251 may be a curved state (for example, a state curved from top to bottom or from right to left), instead of a flat state. In this embodiment, when an external force is applied to the flexible display unit 251, the flexible display unit 251 may transition to the second state such that the flexible display unit is deformed into the flat state (or a less curved state) or into a more curved state. If desired, the flexible display unit 251 may implement a flexible touch screen using a touch sensor in combination with the display. When a touch is received at the flexible touch screen, the controller 180 may execute a command corresponding to the touch input. 
In general, the flexible touch screen is configured to detect a touch and other inputs in both the first and second states. One option is to configure the mobile terminal 200 to include a deformation sensor that detects the deformation of the flexible display unit 251. The deformation sensor may be included in the detection unit 140.
[0011] The deformation sensor may be positioned in the flexible display unit 251 or the housing 201 to detect information relating to the deformation of the flexible display unit 251. Examples of such information relating to the deformation of the flexible display unit 251 may be a deformation direction, a deformation degree, a deformation position, a deformation time amount, an acceleration with which the deformed flexible display unit 251 is restored, and the like. Other possibilities include virtually any type of information that can be detected in response to the curvature of the flexible display unit, or detected while the flexible display unit 251 transitions into, or remains in, the first and second states. In some embodiments, the controller 180 or another component may change the information displayed on the flexible display unit 251, or generate a control signal to control a function of the mobile terminal 200, based on the deformation information of the flexible display unit 251. Such information is generally detected by the deformation sensor. The mobile terminal 200 is shown with a housing 201 for housing the flexible display unit 251. The housing 201 can be deformed together with the flexible display unit 251, taking into account the characteristics of the flexible display unit 251. A battery (not shown in this figure) positioned in the mobile terminal 200 may also be deformable in cooperation with the flexible display unit 251, taking into account the characteristics of the flexible display unit 251. One technique for implementing such a battery is to use a stacking and folding method for stacking battery cells. The deformation of the flexible display unit 251 is not limited to deformation by an external force. For example, the flexible display unit 251 may be deformed into the second state from the first state by user control, application control, or the like.
Figure 3 is a flowchart illustrating an operation of the mobile terminal according to one embodiment of the present invention. Referring to Figure 3, the controller 180 of the mobile terminal 100 may indicate a state of an application by means of an icon corresponding to the application. As a result, a user can recognize the state of the application simply by viewing the icon displayed on a home screen, before running the application. The controller may enter a home screen display state displaying at least one icon (S10).
[0012] At least one icon can be displayed on the home screen. Each icon may correspond to a specific application. Specifically, when an icon is selected, the specific application corresponding to it can be executed. The home screen can be displayed in different environments. For example, the home screen can be displayed when power is applied again, when the display 151 is turned on from a standby state, and when the execution of a specific application is completed. The controller 180 can detect a state of an application corresponding to a specific icon (S20). The state of the application may be related to a state of the mobile terminal 100. The state of the application may be related to a state of a device that constitutes the mobile terminal 100. The state of the application may be related to a functional state of an application executed in the mobile terminal 100. For example, in the case of an application for capturing images using the camera 121 of the mobile terminal 100, the controller 180 can detect a current state of the camera 121. In other words, the controller 180 can detect whether the camera 121 is in a front capture state or a rear capture state, which filter is applied to photographs taken using the camera 121, and the like. As another example, the controller 180 can detect a state of a web browser application installed in the mobile terminal 100. Specifically, the controller 180 can detect a state in which the execution of the web browser application has been completed, and the like. The controller 180 may display the icon corresponding to the application so that the icon indicates the state of the application (S30). When the icon corresponding to the application indicates the state of the application, a user can easily recognize the execution state of the application to be started simply by viewing the icon, before selecting it.
For example, when a photography icon is selected to control the camera 121, the user can recognize whether the camera 121 is in a front capture mode, a rear capture mode, a still image capture mode, or a video capture mode by viewing the display state of the icon. Figures 4 and 5 illustrate an operation of the mobile terminal shown in Figure 3.
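By way of illustration only, the detect-and-display loop of steps S10 to S30 may be sketched in Python; all identifiers here (the dictionary fields, shape names, and function names) are hypothetical and not part of this disclosure:

```python
# Hypothetical sketch of steps S10-S30: detect an application's
# execution state and choose the icon appearance reflecting it.

def detect_camera_state(terminal):
    # S20: query the device state; `terminal` is a plain dict standing
    # in for the controller's access to the camera hardware.
    return {
        "active_camera": terminal.get("active_camera", "front"),
        "mode": terminal.get("mode", "still"),
    }

def icon_for_state(state):
    # S30: pick a shape so the user can read the state from the icon.
    shape = "lens_front" if state["active_camera"] == "front" else "camera_back"
    overlay = "video" if state["mode"] == "video" else None
    return {"shape": shape, "overlay": overlay}

icon = icon_for_state(detect_camera_state({"active_camera": "rear"}))
print(icon["shape"])  # camera_back
```

With such a mapping, the home screen need only re-run `icon_for_state` whenever the detected state changes, so the displayed icon always reflects the camera that will actually be activated.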
[0013] As shown, the mobile terminal 100 according to one embodiment of the present invention may comprise a plurality of cameras 121. With reference to Figure 4, a front camera 121a and a rear camera 121b may be provided on the front and the back of the mobile terminal 100, respectively. In general, the front camera 121a can be used by the user of the mobile terminal 100 to take a photographic self-portrait, and the rear camera 121b can be used to capture external objects such as other people. Referring to Figure 5, an image may be captured using the front camera 121a at a time t1. Specifically, an image of the user of the mobile terminal 100 can be captured using the front camera 121a. The camera application may be terminated at a time t2. The camera application can be executed again at a time t3. The camera application may operate one of the front camera 121a and the rear camera 121b according to a previous setting value. For example, when the front camera 121a is used at time t1 and the camera application is then terminated at time t2 without changing the camera mode, the front camera 121a can be used again at time t3. When the front camera 121a is operated at time t3, the user of the mobile terminal 100 may be inconvenienced by an unexpected capture direction. For example, an image of the user handling the mobile terminal 100 is unexpectedly captured by the front camera 121a, and thus the user may be embarrassed. Alternatively, the user may have to perform additional manipulation to operate the rear camera 121b instead of the front camera 121a. Figures 6, 7 and 8 illustrate display states of an icon corresponding to the camera of the mobile terminal shown in Figure 3. As shown, the controller 180 of the mobile terminal 100 according to one embodiment of the present invention can indicate a current state of the mobile terminal 100 by means of the icon.
[0014] Referring to Figure 6(a), the mobile terminal 100 may be in a state in which the front camera 121a is in an activation waiting state. For example, when the operation of capturing an image using the front camera 121a was performed immediately before the current operation, the front camera 121a can be activated when the camera application is executed. The controller 180 may display a first icon I1a when the front camera 121a is in the activation waiting state. The first icon I1a may be an icon representing that the lens of the camera is facing forward. Therefore, the user can intuitively recognize that the front camera 121a will be operated when the camera application is executed in the current state. Referring to Figure 6(b), when the rear camera 121b of the mobile terminal 100 is in the activation waiting state, the controller 180 may display a second icon I1b. The second icon I1b can assume a shape corresponding to the rear profile of a generic camera. Therefore, the user can intuitively recognize, when he observes the second icon I1b, that the rear camera 121b will be activated when the camera application is executed. Referring to Figure 7, the display 151 may display a home screen BG. The home screen BG may display icons I corresponding to applications. The displayed icons may include an icon relating to the camera 121.
[0015] The controller 180 may display the first icon I1a corresponding to the camera 121. The first icon I1a may represent a generic camera lens. As a result, when he observes the first icon I1a, the user can intuitively recognize that the photograph will be taken in a direction in which the camera lens is directed towards the user. Specifically, the user can recognize that the front camera 121a of the mobile terminal 100 will be activated. Referring to Figure 8, the display 151 may display the home screen BG. The controller 180 may change the icon relating to the camera 121 so that the icon corresponds to the current camera state when it displays the icon. For example, the controller 180 may display the second icon I1b representing the rear profile of a generic camera. When the second icon I1b is displayed, the user can intuitively recognize that the photograph will be taken with the camera lens facing forward. Specifically, the user can recognize that the rear camera 121b of the mobile terminal 100 will be activated. Figures 9 to 19 illustrate manipulations relating to the camera of the mobile terminal shown in Figure 3 and the icon corresponding thereto.
[0016] As shown, the controller 180 of the mobile terminal 100 according to one embodiment of the present invention can indicate a current state of the camera 121 constituting the mobile terminal through the shape of the icon corresponding to the camera 121. As shown in Figure 9, the controller 180 may change the icon display state according to a resolution of the camera 121. For example, the controller 180 may clearly display the first or second icon I1a or I1b when the resolution of the camera 121 is high, as shown in Figure 9(a). When the resolution of the camera 121 is changed, the controller 180 may change the definition of the first or second icon I1a or I1b in response to the change of resolution. For example, the controller 180 may reduce the definition of the first or second icon I1a or I1b in response to a decrease in the resolution of the camera 121, as shown in Figures 9(b) and 9(c). Therefore, the user can intuitively recognize the current resolution of the camera 121 by observing the definition of the first or second icon I1a or I1b. Referring to Figure 10, the controller 180 may change the display of the icon according to the operation mode of the camera 121. For example, the controller 180 may display the first icon I1a with the profile of a generic still camera in the still image capture mode, as shown in Figure 10(a), and display the second icon I2a with the profile of a video camera in the video capture mode, as shown in Figure 10(b). Therefore, the user can intuitively recognize the current state of the camera based on the profile of the icon, which changes according to the photography mode; this differs from the conventional method in which the photography mode is recognized only after the camera application has been executed by selecting the corresponding icon. Referring to Figure 11, the controller 180 may perform a function depending on the size of a displayed icon.
[0017] Referring to Figure 11(a), the controller 180 may display the first icon I1a. When the first icon I1a is selected, the controller 180 may operate the front camera 121a. Referring to Figure 11(b), the user can manipulate the displayed first icon I1a. For example, the user can change the size of the displayed first icon I1a using first and second fingers F1 and F2. Referring to Figure 11(c), the first icon I1a can be changed to the second icon I2a having a different size. The controller 180 may perform various operations based on user touch points when the first icon I1a is changed to the second icon I2a. For example, the controller 180 may operate the front camera 121a when a first region A1 is touched, and operate the rear camera 121b when a second region A2 is touched. Specifically, even within a single icon, different operations can be performed according to the selected portion of the icon. Referring to Figure 12, the controller 180 may display different pieces of information when an icon size is changed. Referring to Figure 12(a), the user can touch the displayed first icon I1a. For example, the user can change the size of the first icon I1a by using the first and second fingers F1 and F2. Referring to Figure 12(b), when the icon size is changed, the controller 180 may display information corresponding to the changed size. For example, when the first icon I1a is changed to the second icon I2a with a different size, related information RI corresponding to the size of the second icon I2a can be displayed. The related information RI may be information concerning the current state of the camera 121. For example, the related information RI may comprise at least one of information indicating which of the front camera and the rear camera is to be used, a resolution, a type of filter applied to the camera, and a shutter speed.
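The per-region behavior of Figure 11(c) reduces to a hit test over the enlarged icon. A minimal Python sketch, assuming for illustration that region A1 is the left half of the icon and region A2 the right half (the patent does not fix their geometry):

```python
# Hypothetical sketch of Figure 11: a touch in region A1 of the
# enlarged icon I2a starts the front camera, a touch in A2 the rear one.

def camera_for_touch(x, y, icon_rect):
    left, top, width, height = icon_rect
    if not (left <= x < left + width and top <= y < top + height):
        return None  # touch landed outside the icon
    # Assumption: A1 is the left half of the icon, A2 the right half.
    return "front" if x < left + width / 2 else "rear"

print(camera_for_touch(10, 10, (0, 0, 100, 100)))  # front
print(camera_for_touch(80, 10, (0, 0, 100, 100)))  # rear
```

The same hit test generalizes to any partition of the icon into regions, each mapped to its own operation.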
Referring to Figure 13, the controller 180 can change a functional state of an application when the size of an icon corresponding to the application is changed.
[0018] The user can touch the displayed first icon I1a. For example, the user can perform a touch operation changing the size of the first icon I1a using the first and second fingers F1 and F2. The controller 180 can change the type of the filter to be applied to the camera when the first icon I1a is changed into the second to fifth icons I2a to I5a having different sizes. Therefore, the inconvenience of changing a filter after running the camera application can be eliminated. The controller 180 may change the colors of the second to fifth icons I2a to I5a depending on the type of filter. For example, the controller 180 may indicate, through the corresponding icon, the color change that would occur when the filter corresponding to that icon size is applied. Referring to Figure 14, the controller 180 may perform different functions depending on the number and/or types of touches applied to the icon. For example, when the first icon I1a is touched once, the controller 180 may execute the camera application corresponding to the first icon I1a. When the first icon I1a is touched twice, the controller 180 can change the camera to a first setting value. When the first icon I1a is touched three times, the controller 180 can change the camera to a second setting value. The first and second setting values may be predetermined by the user and/or the controller 180. For example, the first and second setting values may be setting values including a predetermined filter type, resolution, shutter speed, and the like. Since the camera settings can be changed by simply touching the icon, the manipulation of the mobile terminal 100 can be simplified. Referring to Figure 15, the execution state of an application corresponding to a displayed icon may be changed by a user touch applied to the icon. Referring to Figure 15(a), the first icon I1a may be displayed. The first icon I1a may correspond to a state in which an image is captured using the front camera 121a.
Referring to Figure 15(b), the user can perform a touch operation on the displayed first icon I1a. The touch operation can be a swipe operation across the first icon I1a using a finger F.
[0019] Referring to Figure 15(c), the controller may change the first icon I1a to the second icon I1b in response to the user's touch operation. The second icon I1b may be an icon corresponding to a state in which an image is captured using the rear camera 121b. Specifically, when the second icon I1b is selected, the controller 180 can begin capturing an image using the rear camera 121b. Referring to Figures 16 and 17, the controller 180 of the mobile terminal 100 according to one embodiment of the present invention may display an additional icon corresponding to a changed state of a specific application.
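Figures 14 and 15 both remap inputs applied directly to the icon onto camera configuration changes, without opening the application. A minimal Python sketch; the preset contents and icon names are hypothetical placeholders, not values fixed by this disclosure:

```python
# Hypothetical sketch of Figures 14 and 15: touches and swipes on the
# camera icon reconfigure the camera before the app is even launched.

PRESETS = {
    2: {"filter": "mono", "resolution": "1080p"},  # first setting value
    3: {"filter": "warm", "resolution": "4k"},     # second setting value
}

def on_icon_touched(times, camera):
    """Figure 14: one touch launches the app, two or three apply a preset."""
    if times == 1:
        return ("launch", camera)
    preset = PRESETS.get(times)
    if preset is None:
        return ("ignore", camera)
    return ("configured", dict(camera, **preset))

def on_icon_swiped(icon):
    """Figure 15: a swipe toggles between the front and rear camera icons."""
    return {"I1a": "I1b", "I1b": "I1a"}[icon]

action, cam = on_icon_touched(2, {"filter": "none"})
print(action, cam["filter"])  # configured mono
print(on_icon_swiped("I1a"))  # I1b
```

The touch count acts as a lightweight selector over stored preset values, which is what lets the setting change happen without first executing the camera application.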
[0020] Referring to Figure 16(a), the user may operate the camera by selecting the first icon I1a at a time t1. Referring to Figure 16(b), when the user selects the first icon I1a at a time t2, the controller 180 can operate the front camera 121a. Specifically, an image of the user holding the mobile terminal 100 can be captured using the front camera 121a. Referring to Figure 16(c), the user can change the state of the camera application so that the rear camera 121b is used at a time t3. Specifically, a front image of the user can be captured using the rear camera 121b.
[0021] Referring to Figure 17, the user may terminate the camera application at a time t4. Specifically, the photographing operation using the rear camera 121b is completed. Referring to Figure 17(a), at a time t5 after the camera application has been terminated, the controller 180 may display the second icon I1b in addition to the first icon I1a that was displayed prior to the execution of the camera application. The first icon I1a may correspond to a default setting of the camera. For example, the default setting state may correspond to execution of the camera application using the front camera 121a. When the default setting is changed by user manipulation, the controller 180 may additionally display an icon representing the changed value. For example, the controller 180 may additionally display the second icon I1b indicating photography using the rear camera 121b, which differs from the default setting. Therefore, the user can select the first icon I1a when he wants to use the front camera 121a according to the default value, and select the second icon I1b when he wants to use the rear camera 121b. The first and second icons I1a and I1b can assume different shapes, and the user can thus intuitively recognize the states by means of the icons corresponding thereto. Referring to Figure 18, the controller 180 may display related icons as a group. Specifically, the controller 180 may display related icons in a group so that the user can intuitively recognize, from the icon group, the different states of related functions. Referring to Figure 18(a), the first and second icons I1a and I1b can be displayed. The first and second icons I1a and I1b may be icons respectively corresponding to different states of the camera. Referring to Figure 18(b), the controller 180 may display the first and second icons I1a and I1b as a group of icons. For example, the controller 180 may display a tray T including the first and second icons I1a and I1b.
The user can perform a touch operation on the tray T using a finger F. For example, the user can touch and drag one side of the tray T. Referring to Figure 18(c), the controller 180 can change the positions of the icons on the tray T in response to the user's touch operation. For example, while the first icon I1a was positioned in front of the second icon I1b before the touch operation, the second icon I1b can be positioned in front of the first icon I1a after the touch operation. Referring to Figure 19, the controller 180 may perform different operations depending on how a specific application is terminated. In other words, the controller may or may not store a modified setting value according to the manner in which the application is terminated. For example, buttons B may be provided on the mobile terminal 100. The buttons B may comprise a home button B1 and a return button B2. When the user selects the home button B1, the controller 180 can terminate the currently running application and display the home screen. The controller 180 can store the current setting value of the camera when the home button B1 is selected. If the camera is subsequently used, the controller 180 can operate the camera based on the stored value. For example, the controller 180 may change a photography direction of the camera, a filter included in the camera, a resolution thereof, etc., based on the setting value stored when the home button B1 was operated, and then operate the camera. When the user selects the return button B2, the controller 180 can terminate the currently running application. The controller 180 may cancel the current camera setting value when the return button B2 is selected. Specifically, the controller 180 may terminate the operation of the camera without storing the current camera setting value. Therefore, when the camera is used later, the camera can operate based on the initial default setting value. Figures 20 to 23 illustrate an operation corresponding to another application of the mobile terminal shown in Figure 3.
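The two termination paths of Figure 19 can be sketched as a single persistence decision; all names below (the default values and the `stored` dictionary) are illustrative assumptions:

```python
# Hypothetical sketch of Figure 19: the home button (B1) terminates the
# application and keeps the changed camera settings, while the return
# button (B2) terminates it and reverts to the defaults.

DEFAULTS = {"camera": "front", "filter": "none"}

def close_app(button, current_settings, stored):
    if button == "home":      # B1: persist the current setting value
        stored = dict(current_settings)
    elif button == "return":  # B2: discard it, fall back to defaults
        stored = dict(DEFAULTS)
    return stored

stored = close_app("home", {"camera": "rear", "filter": "mono"}, DEFAULTS)
print(stored["camera"])  # rear
stored = close_app("return", {"camera": "rear", "filter": "mono"}, stored)
print(stored["camera"])  # front
```

The next launch of the camera then simply reads `stored`, which is why the same application can come up with either the persisted or the default configuration depending only on which button ended the previous session.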
As shown, the mobile terminal 100 according to one embodiment of the present invention can indicate states of an application by means of an icon in different ways. Referring to Figure 20(a), the first icon I1a may correspond to a web browser application. Specifically, when the first icon I1a is selected, the web browser application can be displayed on the display 151. The controller 180 can display a blank page BP, which is not linked to a specific web page, when the first icon I1a is selected. When the blank page BP of the web browser application is displayed, the user can enter a web address in the blank page BP for web browsing. Referring to Figure 20(b), the second icon I1b may be provided. The second icon I1b may correspond to the web browser application. Specifically, the web browser application may be displayed when the second icon I1b is selected. The controller 180 may display a specific page HP when the second icon I1b is selected. Specifically, the blank page BP is not displayed. The specific page HP may be a page that has been previously viewed by the user.
[0022] For example, the specific page HP may be a web page that has been accessed using the web browser application. The second icon I1b can be displayed differently from the first icon I1a. Specifically, at least one of the color, shape and size of the second icon I1b may differ from that of the first icon I1a, while the user can still intuitively recognize that the second icon I1b is related to the web browser application. With reference to Figure 21, icons may be displayed in different forms.
[0023] With reference to Figure 21(a), an icon may be displayed differently depending on the states of the application corresponding to the icon. For example, the first icon I1a may correspond to a first web page, the second icon I1b may correspond to a second web page, and a third icon I1c may correspond to a third web page. While the general shapes of the first, second, and third icons I1a, I1b, and I1c are similar, at least one of their colors, shapes, and sizes may be different. Therefore, the user can intuitively recognize that the web browser application is executed when one of the icons is selected, and that the first, second and third icons I1a, I1b and I1c respectively correspond to different web pages.
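The per-page icon variants of Figures 20 and 21 can be sketched as icon generation keyed on visited pages. A Python sketch under stated assumptions: the naming scheme (I1a, I1b, I1c, …) and the color rotation are illustrative choices, not prescribed by this disclosure:

```python
# Hypothetical sketch of Figures 20-21: each previously visited web page
# gets its own variant of the browser icon, distinguished here by color.
import itertools

COLORS = itertools.cycle(["blue", "green", "red"])

def new_page_icon(icons, url):
    # Assumption: variants are named I1a, I1b, I1c, ... in creation order.
    name = "I1" + chr(ord("a") + len(icons))
    icons[name] = {"app": "browser", "page": url, "color": next(COLORS)}
    return name

icons = {}
print(new_page_icon(icons, "http://example.com"))  # I1a
print(new_page_icon(icons, "http://example.org"))  # I1b
```

Because every variant records its own `page`, selecting a variant can open that page directly instead of the blank page BP.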
[0024] With reference to Figure 21(b), the icons can be displayed as a group of icons. Specifically, a plurality of icons corresponding to the same attribute and/or application can be displayed as a group. For example, the first, second and third icons I1a, I1b and I1c may be icons corresponding to the web browser application. The first, second and third icons I1a, I1b and I1c can respectively correspond to different web pages. When new icons are generated based on a user operation and/or a control operation of the controller 180, the controller 180 may display the generated icons as a group. The controller 180 may display the icons in a tray-like region so that the user can intuitively recognize that the icons are grouped. Referring to Figure 22(a), the user may manipulate the first, second, and third icons I1a, I1b, and I1c grouped in the tray T. For example, the user may perform a touch-and-drag operation starting at one side of the tray T. Referring to Figure 22(b), the controller 180 may change the icon positions in response to the user's touch operation. For example, the controller 180 may move the third icon I1c, which was positioned behind the first icon I1a before the touch operation, so that the third icon I1c is positioned in front. Because the plurality of icons are positioned in a tray-like region, this can provide a more intuitive environment for use. Specifically, because the icons are positioned in a region assuming the shape of a tray T that can be rotated, the user can easily manipulate the icons. Referring to Figure 22(c), the controller 180 may display information corresponding to the icon positioned in front of the other icons. For example, when the third icon I1c is positioned in front of the other icons in the tray T, an information window PVP may be displayed. The information window PVP may display the content of a web page corresponding to the third icon I1c in the form of a preview image.
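The rotatable tray T of Figures 18 and 22 behaves like a circular list whose head is the front icon. A minimal Python sketch, assuming one drag step rotates the tray by one position (the patent does not fix the drag-to-rotation ratio):

```python
# Hypothetical sketch of Figures 18 and 22: grouped icons sit in a
# tray T; dragging across the tray rotates which icon is in front.
from collections import deque

def make_tray(icons):
    return deque(icons)  # index 0 is the icon currently shown in front

def drag_tray(tray, steps=1):
    tray.rotate(-steps)  # each drag step brings the next icon forward
    return tray[0]       # the front icon, e.g. the one previewed in PVP

tray = make_tray(["I1a", "I1b", "I1c"])
print(drag_tray(tray))     # I1b
print(drag_tray(tray, 2))  # I1a
```

Whatever ends up at index 0 after the rotation is the icon whose preview the controller would show in the information window.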
Referring to Figure 23(a), the user may touch the first icon I1a with a finger F. The controller 180 may perform different operations depending on the properties of the touch operation. For example, the controller 180 can execute the application linked to the corresponding icon when a single short touch is input. The controller 180 may display a menu for programming a function of the icon when a long touch is input. Referring to Figure 23(b), the controller 180 may display a menu for programming a function corresponding to a specific touch operation applied to the first icon I1a. For example, the controller 180 may provide a function of initializing a stored web page, program a secret mode so that the corresponding web page can be consulted only when a password or the like is entered, or generate a new tray corresponding to a new group of icons. Figure 24 illustrates an operation corresponding to another application of the mobile terminal shown in Figure 3.
[0025] As shown, the controller 180 of the mobile terminal 100 according to one embodiment of the present invention can generate icons corresponding to control values of various external devices. Referring to Figure 24(a), a first icon I1a may be provided, through which a control application for an external device 200 may be executed. For example, the user can execute a remote controller application to operate the external device 200 by selecting the first icon I1a. With reference to Figure 24(b), a setting value ST for the specific external device 200 that is preferred by the user, or that was last used, may exist. For example, when the external device 200 is an air conditioner, a preferred temperature and fan speed may exist. The controller 180 can store the specific setting value ST. The controller 180 may display a second icon I1b indicating that the specific setting value ST has been stored. The second icon I1b may have a shape similar to that of the first icon I1a so that the user can intuitively recognize that the second icon I1b and the first icon I1a relate to the same function. However, at least one of the color, size and additional indication of the second icon I1b may differ from that of the first icon I1a, so that the user can intuitively recognize that the external device 200 will be operated with the specific setting value ST when the second icon I1b is selected. The mobile terminal control method described above can be written as computer programs and can be implemented in digital microprocessors that execute the programs using a computer-readable recording medium. The control method of the mobile terminal can be executed by software. The software may include code segments that perform the necessary tasks. The programs or code segments may also be stored on a processor-readable medium or may be transmitted as a computer data signal combined with a carrier via a transmission medium or communication network.
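As the preceding paragraph notes, the described control method can be written as a program. A minimal Python sketch of Figure 24's stored-setting icon; the dictionary layout, icon names, and setting fields are hypothetical illustrations only:

```python
# Hypothetical sketch of Figure 24: when a preferred setting value ST
# for an external device is stored, a second icon I1b is generated that
# re-applies that value when selected.

def store_setting(icons, setting):
    # Derive a variant icon (I1b) from the base remote-control icon (I1a).
    icons = dict(icons)
    icons["I1b"] = {"app": icons["I1a"]["app"], "setting": setting}
    return icons

def select_icon(icons, name):
    icon = icons[name]
    # Return the application to run and the stored preset, if any.
    return (icon["app"], icon.get("setting"))

icons = {"I1a": {"app": "remote"}}
icons = store_setting(icons, {"temperature": 22, "fan": "low"})
print(select_icon(icons, "I1b"))  # ('remote', {'temperature': 22, 'fan': 'low'})
```

Selecting I1a still launches the remote controller with no preset, while I1b launches it with the stored setting value, which mirrors the two-icon distinction the figure describes.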
The computer-readable recording medium may be any data storage device that can store data that can thereafter be read by a computer system. Examples of the computer-readable recording medium may include ROM, RAM, CD-ROM, DVD±ROM, DVD-RAM, magnetic tapes, floppy disks, and optical data storage devices. The computer-readable recording medium may also be distributed over network-coupled computer systems such that the computer-readable code is stored and executed in a distributed manner. A mobile terminal may include a first touch screen configured to display a first object, a second touch screen configured to display a second object, and a controller configured to receive a first touch input applied to the first object and to link the first object to a function corresponding to the second object upon receipt of a second touch input applied to the second object while the first touch input is maintained. A method of controlling a mobile terminal may be provided, which includes displaying a first object on the first touch screen, displaying a second object on the second touch screen, receiving a first touch input applied to the first object, and linking the first object to a function corresponding to the second object when a second touch input applied to the second object is received while the first touch input is maintained.
[0026] Any reference in this specification to "an embodiment," "exemplary embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification do not necessarily all refer to the same embodiment. Furthermore, when a particular feature, structure or characteristic is described in connection with any embodiment, it is understood that it is within the abilities of those skilled in the art to realize such a feature, structure or characteristic in connection with other embodiments. Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments may be devised by those skilled in the art that will fall within the spirit and scope of the principles of this description. More particularly, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement, within the scope of the description, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.
Claims (15)
[0001]
REVENDICATIONS1. A mobile terminal (100), comprising: a display (151) configured to display information; and a controller (180) configured to: cause the display (151) to display a home screen on which at least one icon is present; execute an application corresponding to an icon among the at least one icon; and having the display (151) display the icon among the at least one icon in at least one shape, color or size that is variable based on an execution state of the application.
[0002]
The mobile terminal (100) of claim 1, wherein the at least one shape, color or size corresponds to a previous execution state of the application.
[0003]
Mobile terminal (100) according to claim 1, wherein the icon among the at least one icon is changed from a first icon (I1a) corresponding to the application to a second icon (I1b) corresponding to the application when the execution state is changed from a first state to a second state, so that at least one shape, color, or size of the second icon (I1b) is different from the at least one shape, color, or size of the first icon (I1a).
[0004]
The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to cause the display (151) to display a first icon (I1a) having at least one shape, color or size corresponding to a first execution state of the application; and to display a second icon (I1b) in addition to the first icon when the execution state is changed from the first execution state to a second execution state, so that two different icons corresponding to the icon among the at least one icon are present on the home screen, the second icon (I1b) having at least one shape, color or size corresponding to the second execution state of the application.
[0005]
5. The mobile terminal (100) of claim 4, wherein the controller (180) is further configured to cause the display (151) to display the first and second icons (11a, 11b) as a group in a specific region of the home screen and to cause the display (151) to change positions of the first and second icons (11a, 11b) in the specific region in response to a touch input applied to the specific region.
[0006]
6. The mobile terminal (100) of claim 1, further comprising: a front camera (121a); and a rear camera (121b), wherein: the application is an image capture application; the icon among the at least one icon is displayed as a first icon (11a) corresponding to the front camera (121a) or a second icon (11b) corresponding to the rear camera (121b) based on an execution state of the image capture application; and the controller (180) is further configured to activate the front camera (121a) in response to an input applied to the first icon (11a) and to activate the rear camera (121b) in response to an input applied to the second icon (11b).
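A hedged sketch of the camera-icon behavior described in claim 6 (the icon names and state fields below are hypothetical, not from the patent): the home-screen icon reflects which camera was last used, and tapping the displayed icon activates the corresponding camera.

```python
# Hypothetical model of claim 6: the camera application's home-screen icon
# is shown as a front-camera icon (11a) or a rear-camera icon (11b)
# depending on the remembered execution state, and an input applied to the
# icon activates the matching camera.

class CameraApp:
    def __init__(self):
        self.last_used = "rear"  # execution state remembered by the terminal
        self.active = None

    def home_screen_icon(self) -> str:
        # First icon (11a) for the front camera, second icon (11b) for the rear.
        return "icon_front_11a" if self.last_used == "front" else "icon_rear_11b"

    def on_icon_tap(self, icon: str) -> str:
        # Activate the camera matching the tapped icon and update the state,
        # so the next home-screen draw shows the corresponding icon.
        self.active = "front" if icon == "icon_front_11a" else "rear"
        self.last_used = self.active
        return self.active
```

Under this sketch the icon shown on the home screen doubles as a record of the previous execution state: switching cameras inside the application changes which icon the launcher draws next time.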
[0007]
7. The mobile terminal (100) of claim 1, wherein the execution state comprises at least one of a resolution of a camera, attributes of an image captured using the camera, or a type of web page.
[0008]
8. The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to cause the display (151) to display information corresponding to a current execution state of the application in response to an input applied to the icon among the at least one icon such that the information and the icon among the at least one icon are displayed together on the home screen.
[0009]
9. The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to: change a size of the icon among the at least one icon in response to a touch input applied to the icon among the at least one icon; and execute the application in a changed execution state based on the changed size.
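Combining claims 9 and 7, one way this could work is that resizing the icon selects the execution state the application is launched in, for example a camera resolution. The thresholds and resolution values below are hypothetical, chosen only to make the sketch concrete:

```python
# Hypothetical sketch of claims 9 and 7: a touch input resizes the icon,
# and the icon's new size selects the execution state (here, the capture
# resolution) the application runs in.

RESOLUTION_BY_SIZE = [
    (48, "640x480"),    # small icon  -> low resolution
    (64, "1280x720"),   # medium icon -> HD
    (96, "1920x1080"),  # large icon  -> full HD
]

def resolution_for_icon_size(size_px: int) -> str:
    """Pick the capture resolution associated with the icon's current size."""
    chosen = RESOLUTION_BY_SIZE[0][1]
    for threshold, resolution in RESOLUTION_BY_SIZE:
        if size_px >= threshold:
            chosen = resolution
    return chosen
```

The design choice here is that the icon itself carries the launch parameter, so the user can read the application's next execution state directly from the home screen before opening it.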
[0010]
10. A method of controlling a mobile terminal (100), the method comprising: displaying a home screen on which at least one icon is present; executing an application corresponding to an icon among the at least one icon; and displaying the icon among the at least one icon in at least one of a shape, color, or size that is variable based on an execution state of the application.
[0011]
11. The method of claim 10, wherein the at least one of the shape, color, or size corresponds to a previous execution state of the application.
[0012]
12. The method of claim 10, further comprising changing the icon among the at least one icon from a first icon (11a) corresponding to the application to a second icon (11b) corresponding to the application when the execution state is changed from a first state to a second state such that at least one of a shape, color, or size of the second icon (11b) is different from the at least one of the shape, color, or size of the first icon (11a).
[0013]
13. The method of claim 10, further comprising: displaying a first icon (11a) having at least one of a shape, color, or size corresponding to a first execution state of the application; and displaying a second icon (11b) in addition to the first icon (11a) when the execution state is changed from the first execution state to a second execution state such that two different icons corresponding to the icon among the at least one icon are present on the home screen, the second icon (11b) having at least one of a shape, color, or size corresponding to the second execution state of the application.
[0014]
14. The method of claim 13, further comprising: displaying the first and second icons (11a, 11b) as a group in a specific region of the home screen; and changing positions of the first and second icons (11a, 11b) in the specific region in response to a touch input applied to the specific region.
[0015]
15. The method of claim 10, wherein: the mobile terminal (100) comprises a front camera (121a) and a rear camera (121b); the application is an image capture application; and the method further comprises: displaying the icon among the at least one icon as a first icon (11a) corresponding to the front camera (121a) or a second icon (11b) corresponding to the rear camera (121b) based on the execution state of the image capture application; and activating the front camera (121a) in response to an input applied to the first icon (11a) and activating the rear camera (121b) in response to an input applied to the second icon (11b).
Similar technologies:
Publication number | Publication date | Patent title
FR3028063A1|2016-05-06|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021133B1|2019-08-30|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3031601B1|2019-08-30|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3026201A1|2016-03-25|
FR3021424B1|2019-09-20|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3024786A1|2016-02-12|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
US8854315B2|2014-10-07|Display device having two touch screens and a method of controlling the same
FR3022367A1|2015-12-18|
RU2419831C2|2011-05-27|Method of controlling switched three-dimensional user interface and mobile terminal using said interface
US8904291B2|2014-12-02|Mobile terminal and control method thereof
FR3021766A1|2015-12-04|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3022649A1|2015-12-25|
FR3025328B1|2019-07-12|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3043478A1|2017-05-12|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3019665A1|2015-10-09|
FR3021485A1|2015-11-27|MOBILE DEVICE AND METHOD OF CONTROLLING THE SAME
FR3021135A1|2015-11-20|
US20150091794A1|2015-04-02|Mobile terminal and control method therof
FR3041785A1|2017-03-31|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3046470B1|2019-11-08|MOBILE TERMINAL
FR3039673A1|2017-02-03|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3039674A1|2017-02-03|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3022648A1|2015-12-25|
FR3041447A1|2017-03-24|
FR3042084B1|2019-11-08|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
Patent family:
Publication number | Publication date
CN106155286A|2016-11-23|
EP3015966A2|2016-05-04|
FR3028063B1|2018-10-12|
CN106155286B|2021-03-26|
EP3015966A3|2016-07-06|
KR20160051986A|2016-05-12|
KR102215997B1|2021-02-16|
US10516828B2|2019-12-24|
US20160127652A1|2016-05-05|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
GB2350991A|1998-11-13|2000-12-13|Ibm|Identifying and closing running application programs easily|
US6104397A|1997-06-30|2000-08-15|Sun Microsystems, Inc.|Method and system for generating improved progress indicators|
JP4730571B2|2000-05-01|2011-07-20|ソニー株式会社|Information processing apparatus and method, and program storage medium|
JP4518716B2|2001-09-25|2010-08-04|三洋電機株式会社|Digital camera|
CN101299802B|2004-03-19|2011-03-02|株式会社理光|Apparatus with display unit, information-processing method|
JP4181206B2|2004-03-19|2008-11-12|株式会社リコー|Electronic device with photographing function, control method of electronic device with photographing function, and program|
KR101058011B1|2004-10-01|2011-08-19|삼성전자주식회사|How to Operate Digital Camera Using Touch Screen|
US7685530B2|2005-06-10|2010-03-23|T-Mobile Usa, Inc.|Preferred contact group centric interface|
KR101371414B1|2007-01-31|2014-03-10|삼성전자주식회사|Combination A/V apparatus with multi function and method for providing UI|
US9086785B2|2007-06-08|2015-07-21|Apple Inc.|Visualization object receptacle|
CN101616199A|2008-06-27|2009-12-30|深圳富泰宏精密工业有限公司|Multiple modes of operation switched system and method|
KR101555055B1|2008-10-10|2015-09-22|엘지전자 주식회사|Mobile terminal and display method thereof|
US8769428B2|2009-12-09|2014-07-01|Citrix Systems, Inc.|Methods and systems for generating a combined display of taskbar button group entries generated on a local machine and on a remote machine|
EP3882750A1|2010-01-20|2021-09-22|Nokia Technologies Oy|User input|
KR101690232B1|2010-05-28|2016-12-27|엘지전자 주식회사|Electronic Device And Method Of Controlling The Same|
KR20120019244A|2010-08-25|2012-03-06|삼성전자주식회사|Control method of a plurality of attribute and portable device thereof|
US9204026B2|2010-11-01|2015-12-01|Lg Electronics Inc.|Mobile terminal and method of controlling an image photographing therein|
CN102004687A|2010-11-04|2011-04-06|东莞宇龙通信科技有限公司|Mobile terminal and program running state display method of same|
GB2486238A|2010-12-08|2012-06-13|Wolfson Microelectronics Plc|A user interface for controlling a device using an icon|
US9360991B2|2011-04-11|2016-06-07|Microsoft Technology Licensing, Llc|Three-dimensional icons for organizing, invoking, and using applications|
CN102207823A|2011-05-13|2011-10-05|宇龙计算机通信科技有限公司|Display method and device for application program|
JP2013080345A|2011-10-03|2013-05-02|Kyocera Corp|Device, method, and program|
JP5834895B2|2011-12-26|2015-12-24|ブラザー工業株式会社|Image processing apparatus and program|
US20130238973A1|2012-03-10|2013-09-12|Ming Han Chang|Application of a touch based interface with a cube structure for a mobile device|
KR102070276B1|2012-04-24|2020-01-28|엘지전자 주식회사|Mobile terminal and control method for the mobile terminal|
CN102722321A|2012-05-22|2012-10-10|中兴通讯股份有限公司|Method and device for switching between double cameras|
KR102155708B1|2013-02-26|2020-09-14|삼성전자 주식회사|Portable terminal and method for operating multi-application thereof|
KR102072584B1|2013-04-19|2020-02-03|엘지전자 주식회사|Digital device and method for controlling the same|
KR20150136801A|2014-05-28|2015-12-08|삼성전자주식회사|User Interface for Application and Device|
JP6343194B2|2014-07-08|2018-06-13|キヤノン株式会社|COMMUNICATION DEVICE, ITS CONTROL METHOD, AND PROGRAM|
WO2016039570A1|2014-09-12|2016-03-17|Samsung Electronics Co., Ltd.|Method and device for executing applications through application selection screen|
Cited by:
USD726765S1|2013-06-09|2015-04-14|Apple Inc.|Display screen or portion thereof with icon|
US10481759B2|2014-12-29|2019-11-19|Lg Electronics Inc.|Bended display device for controlling function of application through image displayed on sub-region, and control method therefor|
USD779556S1|2015-02-27|2017-02-21|Samsung Electronics Co., Ltd.|Display screen or portion thereof with an icon|
USD771137S1|2015-08-11|2016-11-08|Samsung Electronics Co., Ltd.|Display screen or portion thereof with icon|
CN106484211A|2015-08-28|2017-03-08|富泰华工业(深圳)有限公司|There is electronic installation and the system optimization method of system optimization function|
US9996981B1|2016-03-07|2018-06-12|Bao Tran|Augmented reality system|
CN106502873B|2016-09-30|2019-12-13|维沃移动通信有限公司|Application program state information display method and terminal|
CN106502872B|2016-09-30|2019-11-29|维沃移动通信有限公司|A kind of display methods and terminal of application state information|
KR20180045338A|2016-10-25|2018-05-04|삼성전자주식회사|Portable apparatus and method for controlling a screen|
KR102057566B1|2016-11-24|2019-12-19|주식회사 하이딥|Touch input method for providing uer interface and apparatus|
USD878404S1|2018-02-21|2020-03-17|Early Warning Services, Llc|Display screen portion with graphical user interface for adding profile picture|
JP6865721B2|2018-07-27|2021-04-28|任天堂株式会社|Programs, information processing devices, information processing methods, and information processing systems|
KR20200129584A|2019-05-09|2020-11-18|삼성전자주식회사|Foldable apparatus and method for controlling photographing using a plurality of cameras|
Legal status:
2016-05-30| PLFP| Fee payment|Year of fee payment: 2 |
2017-05-30| PLFP| Fee payment|Year of fee payment: 3 |
2018-01-12| PLSC| Search report ready|Effective date: 20180112 |
2018-05-29| PLFP| Fee payment|Year of fee payment: 4 |
2019-04-10| PLFP| Fee payment|Year of fee payment: 5 |
2020-04-08| PLFP| Fee payment|Year of fee payment: 6 |
2021-04-09| PLFP| Fee payment|Year of fee payment: 7 |
Priority:
Application number | Filing date | Patent title
KR20140149588|2014-10-30|
KR1020140149588A|KR102215997B1|2014-10-30|2014-10-30|Mobile terminal and method for controlling the same|